Gemma/[Gemma_2]Function_Calling_with_Groq_Langchain.ipynb (252 lines of code) (raw):

{ "cells": [ { "cell_type": "markdown", "id": "Cast6XwmzEhJ", "metadata": { "id": "Cast6XwmzEhJ" }, "source": [ "##### Copyright 2025.\n", "\n", "# Gemma 2 Function Calling with LangChain and Groq\n", "\n", "This tutorial demonstrates how to perform **function calling** using the `gemma-2-9b-it` model hosted on [Groq Cloud](https://console.groq.com/) with **LangChain integration**.\n", "\n", "<table align=\"left\">\n", " <td>\n", " <a target=\"_blank\" href=\"https://colab.research.google.com/github/google-gemini/gemma-cookbook/blob/main/Gemma/[Gemma_2]Function_Calling_with_Groq_Langchain.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>\n", " </td>\n", "</table>" ] }, { "cell_type": "code", "execution_count": 1, "id": "h4YFKsPszEhL", "metadata": { "id": "h4YFKsPszEhL" }, "outputs": [], "source": [ "# Licensed under the Apache License, Version 2.0 (the \"License\");\n", "# you may not use this file except in compliance with the License.\n", "# You may obtain a copy of the License at\n", "#\n", "# http://www.apache.org/licenses/LICENSE-2.0\n", "#\n", "# Unless required by applicable law or agreed to in writing, software\n", "# distributed under the License is distributed on an \"AS IS\" BASIS,\n", "# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n", "# See the License for the specific language governing permissions and\n", "# limitations under the License." ] }, { "cell_type": "markdown", "id": "kiyjZSuozEhL", "metadata": { "id": "kiyjZSuozEhL" }, "source": [ "## Setup\n", "\n", "You will need:\n", "\n", "- A [Groq API key](https://console.groq.com/keys)\n", "- Python 3.10+\n", "- Colab secrets enabled to securely access the key." ] }, { "cell_type": "code", "execution_count": 2, "id": "mVV5QcohzEhM", "metadata": { "id": "mVV5QcohzEhM" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\u001b[?25l \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m0.0/126.7 kB\u001b[0m \u001b[31m?\u001b[0m eta \u001b[36m-:--:--\u001b[0m\r\u001b[2K \u001b[91m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m\u001b[91m╸\u001b[0m\u001b[90m━\u001b[0m \u001b[32m122.9/126.7 kB\u001b[0m \u001b[31m3.8 MB/s\u001b[0m eta \u001b[36m0:00:01\u001b[0m\r\u001b[2K \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m126.7/126.7 kB\u001b[0m \u001b[31m1.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n", "\u001b[?25h" ] } ], "source": [ "!pip install --quiet groq langchain langchain_groq" ] }, { "cell_type": "code", "execution_count": 3, "id": "ZUx4IfpozEhM", "metadata": { "id": "ZUx4IfpozEhM" }, "outputs": [], "source": [ "import os\n", "from google.colab import userdata\n", "\n", "# Load your Groq API key from Colab secrets\n", "os.environ[\"GROQ_API_KEY\"] = userdata.get(\"GROQ_API_KEY\")" ] }, { "cell_type": "markdown", "id": "3_Mr-eiqzEhM", "metadata": { "id": "3_Mr-eiqzEhM" }, "source": [ "## Define External Tools (Functions)\n", "\n", "We define a sample function that fetches weather information." ] }, { "cell_type": "code", "execution_count": 27, "id": "Mei_tmbxzEhN", "metadata": { "id": "Mei_tmbxzEhN" }, "outputs": [], "source": [ "from langchain.tools import Tool\n", "from langchain_groq import ChatGroq\n", "from langchain.agents import initialize_agent\n", "import re\n", "\n", "# Define a single-input weather tool\n", "def get_current_weather_tool(input_str: str) -> str:\n", " \"\"\"Get the current weather in a location.\"\"\"\n", " # Naive parsing: look for location and optional unit\n", " match = re.search(r\"in (\\w+)\", input_str.lower())\n", " unit_match = re.search(r\"(celsius|fahrenheit)\", input_str.lower())\n", "\n", " location = match.group(1) if match else \"unknown\"\n", " unit = unit_match.group(1) if unit_match else \"fahrenheit\"\n", " return f\"Final Answer: The weather in {location.title()} is 72° {unit.title()} and sunny.\"\n" ] }, { "cell_type": "markdown", "id": "CptR20G3zEhN", "metadata": { "id": "CptR20G3zEhN" }, "source": [ "## Initialize LangChain Agent with Gemma 2 on Groq\n", "\n", "We'll use `gemma-2-9b-it` via LangChain and bind our tool." ] }, { "cell_type": "code", "execution_count": 28, "id": "lPs8KrHrzEhO", "metadata": { "id": "lPs8KrHrzEhO" }, "outputs": [], "source": [ "get_current_weather = Tool(\n", " name=\"get_current_weather\",\n", " func=get_current_weather_tool,\n", " description=\"Get the current weather by passing a string like 'What's the weather in Paris in Celsius?'\",\n", " return_direct=True\n", ")\n", "\n", "# Initialize LLM\n", "llm = ChatGroq(model=\"gemma-2-9b-it\", temperature=0)\n", "\n", "# Initialize the agent (no need to bind tools for this setup)\n", "agent = initialize_agent(\n", " tools=[get_current_weather],\n", " llm=llm,\n", " agent=\"zero-shot-react-description\",\n", " verbose=True\n", ")\n" ] }, { "cell_type": "markdown", "id": "GW8ySM16zEhO", "metadata": { "id": "GW8ySM16zEhO" }, "source": [ "## Run Example Prompt\n", "\n", "We now run a natural language prompt that should trigger function calling." ] }, { "cell_type": "code", "execution_count": 29, "id": "hOEhiAj4zEhO", "metadata": { "id": "hOEhiAj4zEhO" }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "\n", "\n", "\u001b[1m> Entering new AgentExecutor chain...\u001b[0m\n", "\u001b[32;1m\u001b[1;3mThought: I need to get the current weather in Tokyo, and the user wants the temperature in Fahrenheit. I should use the get_current_weather function to get the current weather.\n", "\n", "Action: get_current_weather\n", "Action Input: What's the weather in Tokyo in Fahrenheit?\u001b[0m\n", "Observation: \u001b[36;1m\u001b[1;3mFinal Answer: The weather in Tokyo is 72° Fahrenheit and sunny.\u001b[0m\n", "\u001b[32;1m\u001b[1;3m\u001b[0m\n", "\n", "\u001b[1m> Finished chain.\u001b[0m\n", "Final Answer: The weather in Tokyo is 72° Fahrenheit and sunny.\n" ] } ], "source": [ "# Example usage\n", "response = agent.run(\"What's the weather in Tokyo in Fahrenheit?\")\n", "print(response)" ] }, { "cell_type": "markdown", "id": "CP_OfmvLzEhP", "metadata": { "id": "CP_OfmvLzEhP" }, "source": [ "## Conclusion\n", "\n", "In this notebook, you learned how to:\n", "\n", "- Connect Gemma 2 (`gemma-2-9b-it`) via Groq using LangChain\n", "- Define and bind external tools (functions)\n", "- Use natural prompts to trigger function calls via LangChain Agents\n", "\n", "This is a powerful building block for integrating APIs and logic into LLM workflows." ] } ], "metadata": { "colab": { "name": "[Gemma_2]Function_Calling_with_Groq_Langchain.ipynb", "toc_visible": true }, "kernelspec": { "display_name": "Python 3", "name": "python3" } }, "nbformat": 4, "nbformat_minor": 0 }